
    Variational Dropout and the Local Reparameterization Trick

    We investigate a local reparameterization technique for greatly reducing the variance of stochastic gradients for variational Bayesian inference (SGVB) of a posterior over model parameters, while retaining parallelizability. This local reparameterization translates uncertainty about global parameters into local noise that is independent across datapoints in the minibatch. Such parameterizations can be trivially parallelized and have variance that is inversely proportional to the minibatch size, generally leading to much faster convergence. Additionally, we explore a connection with dropout: Gaussian dropout objectives correspond to SGVB with local reparameterization, a scale-invariant prior and proportionally fixed posterior variance. Our method allows inference of more flexibly parameterized posteriors; specifically, we propose variational dropout, a generalization of Gaussian dropout where the dropout rates are learned, often leading to better models. The method is demonstrated through several experiments.
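
    A minimal numpy sketch of this local reparameterization for one fully connected layer, assuming a factorized Gaussian posterior over the weights (sizes and variable names are illustrative, not from the paper's code): instead of sampling a weight matrix shared across the minibatch, the layer samples the pre-activations directly, so the injected noise is independent per datapoint.

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative sizes; not the paper's experimental setup.
        batch, d_in, d_out = 32, 784, 300
        X = rng.normal(size=(batch, d_in))

        # Factorized Gaussian posterior q(W_ij) = N(mu_ij, sigma_ij^2), learned in practice.
        mu = rng.normal(scale=0.05, size=(d_in, d_out))
        log_sigma2 = np.full((d_in, d_out), -6.0)

        # Global reparameterization: one weight sample shared by the whole minibatch,
        # so the gradient noise is correlated across datapoints.
        W = mu + np.exp(0.5 * log_sigma2) * rng.normal(size=mu.shape)
        B_global = X @ W

        # Local reparameterization: sample the pre-activations directly. Since
        # b_mj = sum_i x_mi * w_ij, we have b_mj ~ N((X @ mu)_mj, (X**2 @ sigma^2)_mj),
        # and the noise is independent for every datapoint in the minibatch.
        gamma = X @ mu
        delta = (X ** 2) @ np.exp(log_sigma2)
        B_local = gamma + np.sqrt(delta) * rng.normal(size=gamma.shape)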

    Specificity and Kinetics of Haloalkane Dehalogenase

    Haloalkane dehalogenase converts halogenated alkanes to their corresponding alcohols. The active site is buried inside the protein and lined with hydrophobic residues. The reaction proceeds via a covalent substrate-enzyme complex. This paper describes a steady-state and pre-steady-state kinetic analysis of the conversion of a number of substrates of the dehalogenase. The kinetic mechanisms for the “natural” substrate 1,2-dichloroethane and for the brominated analog and nematocide 1,2-dibromoethane are given. In general, brominated substrates had a lower Km but a similar kcat compared with the chlorinated analogs. The rate of C-Br bond cleavage was higher than the rate of C-Cl bond cleavage, which is in agreement with the leaving group abilities of these halogens. The lower Km for brominated compounds therefore originates both from the higher rate of C-Br bond cleavage and from a lower Ks for bromo-compounds. However, the rate-determining step in the conversion (kcat) of 1,2-dibromoethane and 1,2-dichloroethane was found to be release of the charged halide ion from the active-site cavity, explaining the different Km but similar kcat values for these compounds. The study provides a basis for the analysis of rate-determining steps in the hydrolysis of various environmentally important substrates.
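
    For readers tracking the Km/kcat argument, the textbook relations for a minimal covalent-intermediate scheme make the reasoning explicit. Take E + S ⇌ E·S → E-alkyl → E + products, with binding constant K_S, carbon-halogen cleavage rate k_2 and halide/product release rate k_3 (a deliberate simplification; the paper's full kinetic mechanism has more steps). Then

        $$k_{\mathrm{cat}} = \frac{k_2\,k_3}{k_2 + k_3}, \qquad K_m = K_S\,\frac{k_3}{k_2 + k_3}.$$

    With halide release rate-determining ($k_3 \ll k_2$), $k_{\mathrm{cat}} \approx k_3$ is nearly substrate-independent, while $K_m \approx K_S\,k_3/k_2$ is lowered both by faster carbon-halogen cleavage (larger $k_2$) and by tighter binding (smaller $K_S$), which are exactly the two effects invoked above for the brominated compounds.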

    Influence of mutations of Val226 on the catalytic rate of haloalkane dehalogenase

    Haloalkane dehalogenase converts haloalkanes to their corresponding alcohols. The 3D structure, reaction mechanism and kinetic mechanism have been studied. The steady-state kcat with 1,2-dichloroethane and 1,2-dibromoethane is limited mainly by the rate of release of the halide ion from the buried active-site cavity. During catalysis, the halogen that is cleaved off from 1,2-dichloroethane (Clα) interacts with Trp125, and the Clβ interacts with Phe172. Both these residues have van der Waals contacts with Val226. To establish the effect of these interactions on catalysis, and in an attempt to change enzyme activity without directly mutating residues involved in catalysis, we mutated Val226 to Gly, Ala and Leu. The Val226Ala and Val226Leu mutants had a 2.5-fold higher catalytic rate for 1,2-dibromoethane than the wild-type enzyme. A pre-steady-state kinetic analysis of the Val226Ala mutant enzyme showed that the increase in kcat could be attributed to an increase in the rate of a conformational change that precedes halide release, causing a faster overall rate of halide dissociation. The kcat for 1,2-dichloroethane conversion was not elevated, even though the rate of chloride release was also faster than in the wild-type enzyme; this was caused by a 3-fold decrease in the rate of formation of the alkyl-enzyme intermediate for 1,2-dichloroethane. Val226 seems to contribute to leaving group (Clα or Brα) stabilization via Trp125, and can influence halide release and substrate binding via an interaction with Phe172. These studies indicate that wild-type haloalkane dehalogenase is optimized for 1,2-dichloroethane, although 1,2-dibromoethane is a better substrate.
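
    The pre-steady-state result reads naturally against a two-step halide release scheme (schematic only; the rate labels $k_4$ and $k_5$ are illustrative, not the paper's notation):

        $$\mathrm{E{\cdot}X^-}\;\underset{k_{-4}}{\overset{k_4}{\rightleftharpoons}}\;\mathrm{E^{*}{\cdot}X^-}\;\xrightarrow{k_5}\;\mathrm{E} + \mathrm{X^-}$$

    If the conformational step $k_4$ limits overall dissociation, accelerating it (as in Val226Ala) raises the net halide release rate and hence kcat; for 1,2-dichloroethane the 3-fold slower alkyl-enzyme formation offsets the faster chloride release, so kcat does not rise.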

    Explaining Machine Learning Classifiers through Diverse Counterfactual Explanations

    Post-hoc explanations of machine learning models are crucial for people to understand and act on algorithmic predictions. An intriguing class of explanations is through counterfactuals, hypothetical examples that show people how to obtain a different prediction. We posit that effective counterfactual explanations should satisfy two properties: feasibility of the counterfactual actions given user context and constraints, and diversity among the counterfactuals presented. To this end, we propose a framework for generating and evaluating a diverse set of counterfactual explanations based on determinantal point processes. To evaluate the actionability of counterfactuals, we provide metrics that enable comparison of counterfactual-based methods to other local explanation methods. We further address necessary tradeoffs and point to causal implications in optimizing for counterfactuals. Our experiments on four real-world datasets show that our framework can generate a set of counterfactuals that are diverse and closely approximate local decision boundaries, outperforming prior approaches to generating diverse counterfactuals. We provide an implementation of the framework at https://github.com/microsoft/DiCE.
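
    The diversity term in this framework is a determinantal point process score over the candidate counterfactuals: build a kernel K with K_ij = 1/(1 + dist(c_i, c_j)) and take det(K), which grows as the set spreads out. A minimal numpy sketch of that score follows; the distance function and example candidates are illustrative, and the full loss in the DiCE repository also balances validity and proximity terms.

        import numpy as np

        def dpp_diversity(cfs: np.ndarray) -> float:
            """DPP-style diversity score for a set of counterfactuals.

            cfs: (k, d) array, one candidate counterfactual per row.
            K[i, j] = 1 / (1 + ||c_i - c_j||_1); det(K) rises as rows spread apart.
            """
            k = len(cfs)
            K = np.empty((k, k))
            for i in range(k):
                for j in range(k):
                    K[i, j] = 1.0 / (1.0 + np.abs(cfs[i] - cfs[j]).sum())
            return float(np.linalg.det(K))

        # Two near-duplicate counterfactuals score lower than two distinct ones.
        close = np.array([[0.1, 0.2], [0.1, 0.25]])
        spread = np.array([[0.1, 0.2], [0.9, 0.8]])
        assert dpp_diversity(close) < dpp_diversity(spread)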

    Recurrent Latent Variable Networks for Session-Based Recommendation

    In this work, we attempt to ameliorate the impact of data sparsity in the context of session-based recommendation. Specifically, we seek to devise a machine learning mechanism capable of extracting subtle and complex underlying temporal dynamics in the observed session data, so as to inform the recommendation algorithm. To this end, we improve upon systems that utilize deep learning techniques with recurrently connected units; we do so by adopting concepts from the field of Bayesian statistics, namely variational inference. Our proposed approach consists of treating the network recurrent units as stochastic latent variables with a prior distribution imposed over them. On this basis, we proceed to infer corresponding posteriors; these can be used for prediction and recommendation generation, in a way that accounts for the uncertainty in the available sparse training data. To allow our approach to scale easily to large real-world datasets, we perform inference under an approximate amortized variational inference (AVI) setup, whereby the learned posteriors are parameterized via (conventional) neural networks. We perform an extensive experimental evaluation of our approach using challenging benchmark datasets, and illustrate its superiority over existing state-of-the-art techniques.
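
    To make "recurrent units as stochastic latent variables" concrete, here is a minimal PyTorch sketch of a single step under the amortized setup: the network outputs the mean and log-variance of a Gaussian posterior over the hidden state, a sample is drawn with the reparameterization trick, and a KL term against a standard-normal prior would enter the training loss. Module names and sizes are illustrative, not the authors' architecture.

        import torch
        import torch.nn as nn

        class StochasticGRUCell(nn.Module):
            """One recurrent step with a Gaussian latent hidden state (illustrative)."""

            def __init__(self, input_dim: int, hidden_dim: int):
                super().__init__()
                self.cell = nn.GRUCell(input_dim, hidden_dim)
                self.to_mu = nn.Linear(hidden_dim, hidden_dim)
                self.to_logvar = nn.Linear(hidden_dim, hidden_dim)

            def forward(self, x, h_prev):
                h_det = self.cell(x, h_prev)  # deterministic recurrent update
                mu, logvar = self.to_mu(h_det), self.to_logvar(h_det)
                # Reparameterized sample from q(h | x, h_prev) = N(mu, diag(sigma^2)).
                h = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
                # KL( N(mu, sigma^2) || N(0, I) ), summed over hidden dimensions.
                kl = 0.5 * (mu.pow(2) + logvar.exp() - logvar - 1).sum(dim=-1)
                return h, kl

        cell = StochasticGRUCell(input_dim=16, hidden_dim=32)
        x = torch.randn(8, 16)               # a batch of 8 item embeddings
        h, kl = cell(x, torch.zeros(8, 32))
        print(h.shape, kl.shape)             # torch.Size([8, 32]) torch.Size([8])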